A Kullback-Leibler Distance Approach to System Identification
Abstract
The use of probability in system identification is shown to be equivalent to measuring the Kullback-Leibler distance between the actual (empirical) and model distributions of the data. When the data are not known completely (being compressed, quantized, aggregated, missing, etc.), the minimum-distance approach can be seen as an asymptotic approximation of probabilistic inference. A class of problems is pointed out where inference via the Kullback-Leibler distance offers an attractive, computationally less demanding alternative to maximum likelihood or Bayesian estimation.
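A minimal sketch of the core equivalence, assuming quantized data summarized by bin counts: minimizing the Kullback-Leibler distance from the empirical distribution to a parametric model is, up to the constant entropy term of the empirical distribution, the same as maximizing the log-likelihood of the binned data. The one-parameter model and the counts below are purely illustrative, not from the paper.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Hypothetical quantized data: outcome counts in four bins (made up)
counts = np.array([12, 30, 41, 17])
empirical = counts / counts.sum()  # empirical distribution p_hat

def model_probs(theta, k=len(counts)):
    # Illustrative one-parameter model: truncated geometric over the bins
    w = theta ** np.arange(k)
    return w / w.sum()

def kl_distance(theta):
    # D(p_hat || q_theta) = sum_i p_hat_i * log(p_hat_i / q_theta_i);
    # minimizing this over theta is, up to the constant entropy of p_hat,
    # the same as maximizing the log-likelihood of the binned data
    q = model_probs(theta)
    return float(np.sum(empirical * np.log(empirical / q)))

res = minimize_scalar(kl_distance, bounds=(1e-3, 1 - 1e-3), method="bounded")
print("minimum-KL (= maximum-likelihood) estimate of theta:", res.x)
```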
Similar papers
Using Kullback-Leibler distance for performance evaluation of search designs
This paper considers the search problem introduced by Srivastava [Sr], which is a model discrimination problem. In the context of search linear models, the discrimination ability of search designs has been studied by several researchers. Some criteria have been developed to measure this capability; however, they are restricted in the sense of being able to work for searching only one possibl...
Model Confidence Set Based on Kullback-Leibler Divergence Distance
Consider the problem of estimating the true density h(·) based upon a random sample X1, …, Xn. In general, h(·) is approximated using an appropriate (in some sense; see below) model fθ(x). This article, using Vuong's (1989) test along with a collection of k (> 2) non-nested models, constructs a set of appropriate models, a so-called model confidence set, for the unknown model h(·). Application of such confide...
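A hedged sketch of how such a set might be assembled from pairwise Vuong tests: each model is kept unless some rival beats it significantly. The function names and the keep-unless-beaten rule are illustrative assumptions, not necessarily the paper's exact procedure.

```python
import numpy as np
from scipy import stats

def vuong_statistic(logf, logg):
    # Vuong's (1989) statistic for two non-nested models, computed from
    # pointwise log-densities evaluated at the sample; asymptotically N(0,1)
    d = np.asarray(logf) - np.asarray(logg)
    return np.sqrt(len(d)) * d.mean() / d.std(ddof=1)

def model_confidence_set(log_densities, alpha=0.05):
    # Hypothetical construction: keep every model that no rival beats
    # significantly at level alpha in a pairwise Vuong test
    z = stats.norm.ppf(1 - alpha)
    names = list(log_densities)
    return [
        a for a in names
        if not any(
            vuong_statistic(log_densities[b], log_densities[a]) > z
            for b in names if b != a
        )
    ]

# Toy demo with three hypothetical candidate models
rng = np.random.default_rng(1)
x = rng.normal(0.2, 1.0, size=500)
logs = {
    "N(0,1)":   stats.norm.logpdf(x, 0.0, 1.0),
    "N(0.2,1)": stats.norm.logpdf(x, 0.2, 1.0),
    "Laplace":  stats.laplace.logpdf(x, 0.0, 1.0),
}
print(model_confidence_set(logs))
```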
Comparison of Kullback-Leibler, Hellinger and LINEX with Quadratic Loss Function in Bayesian Dynamic Linear Models: Forecasting of Real Price of Oil
In this paper we examine the application of the Kullback-Leibler, Hellinger and LINEX loss functions in a Dynamic Linear Model, using the real price of oil over 106 years of data from 1913 to 2018, concerning the asymmetry problem in filtering and forecasting. We use the DLM form of the basic Hotelling Model under the quadratic, Kullback-Leibler, Hellinger and LINEX loss functions, trying to address the ...
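For reference, the LINEX loss is L(Δ) = b(e^{aΔ} − aΔ − 1), and the Bayes-optimal point forecast under it is −(1/a) log E[e^{−aY}]. A small sketch of how the asymmetry shifts a point forecast relative to quadratic loss; the predictive draws and parameter values are made up, not the paper's data.

```python
import numpy as np

def linex_loss(error, a=0.5, b=1.0):
    # LINEX loss b*(exp(a*e) - a*e - 1): exponential on one side of zero,
    # roughly linear on the other, unlike the symmetric quadratic loss
    return b * (np.exp(a * error) - a * error - 1.0)

def linex_point_forecast(draws, a=0.5):
    # Bayes-optimal point forecast under LINEX loss from predictive
    # draws: -(1/a) * log E[exp(-a * Y)]
    return -np.log(np.mean(np.exp(-a * np.asarray(draws)))) / a

rng = np.random.default_rng(0)
draws = rng.normal(60.0, 5.0, size=100_000)  # hypothetical predictive draws

print("forecast under quadratic loss (mean):", draws.mean())
print("forecast under LINEX loss (a=0.5):  ", linex_point_forecast(draws))
# with a > 0 over-prediction is costlier than under-prediction,
# so the LINEX forecast is shaded below the predictive mean
```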
Discriminant Analysis for ARMA Models Based on Divergency Criterion: A Frequency Domain Approach
The extension of classical analysis to time series data is a basic problem faced in many fields, such as engineering, economics and medicine. The main objective of discriminant time series analysis is to examine how far it is possible to distinguish between various groups. There are two situations to be considered in linear time series models. Firstly, when the main discriminatory informati...
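One classical frequency-domain criterion of this kind is the Kullback-Leibler disparity between Gaussian stationary processes with spectra f1, f2, namely (1/4π)∫(f1/f2 − log(f1/f2) − 1) dλ. The sketch below uses it to assign a series to the closer of two AR(1) groups; the AR(1) setting and all parameter values are illustrative assumptions, not necessarily this paper's exact criterion.

```python
import numpy as np

def ar1_spectrum(phi, sigma2, lam):
    # Spectral density of an AR(1) process at frequencies lam in [-pi, pi]
    return sigma2 / (2 * np.pi * (1 - 2 * phi * np.cos(lam) + phi ** 2))

def spectral_kl_disparity(f1, f2, lam):
    # Kullback-Leibler-type disparity between Gaussian stationary processes
    # with spectra f1, f2: (1/(4*pi)) * integral of (f1/f2 - log(f1/f2) - 1)
    r = f1 / f2
    return np.trapz(r - np.log(r) - 1.0, lam) / (4 * np.pi)

lam = np.linspace(-np.pi, np.pi, 2049)
f_a = ar1_spectrum(0.8, 1.0, lam)     # group A: AR(1) with phi = 0.8
f_b = ar1_spectrum(0.2, 1.0, lam)     # group B: AR(1) with phi = 0.2
f_new = ar1_spectrum(0.75, 1.0, lam)  # estimated spectrum of a new series

# assign the new series to the group whose spectrum is closer
d_a = spectral_kl_disparity(f_new, f_a, lam)
d_b = spectral_kl_disparity(f_new, f_b, lam)
print("disparity to A:", d_a, "| to B:", d_b,
      "| classified as:", "A" if d_a < d_b else "B")
```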
Information Measures via Copula Functions
In applications of differential geometry to problems of parametric inference, the notion of divergence is often used to measure the separation between two parametric densities. Among these, in this paper we examine measures such as the Kullback-Leibler information, J-divergence, Hellinger distance, -divergence, and so on. Properties and results related to distances between probability d...
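As one concrete instance of the copula view: the mutual information of a bivariate distribution (the Kullback-Leibler distance between the joint law and the product of its marginals) depends only on the copula density c, as E[log c(U, V)]. A sketch for the Gaussian copula, where the closed form −½ log(1 − ρ²) is available to check a Monte Carlo estimate; this example is illustrative and not taken from the paper.

```python
import numpy as np

def gaussian_copula_mi(rho):
    # Mutual information of a Gaussian copula with correlation rho;
    # closed form: -0.5 * log(1 - rho^2)
    return -0.5 * np.log(1.0 - rho ** 2)

def gaussian_copula_mi_mc(rho, n=200_000, seed=0):
    # Monte Carlo check of MI = E[log c(U, V)], evaluating the Gaussian
    # copula log-density at the normal scores of a sample from the copula
    rng = np.random.default_rng(seed)
    x = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=n)
    det = 1.0 - rho ** 2
    quad = (rho ** 2 * (x[:, 0] ** 2 + x[:, 1] ** 2)
            - 2.0 * rho * x[:, 0] * x[:, 1]) / (2.0 * det)
    log_c = -0.5 * np.log(det) - quad
    return log_c.mean()

print("exact MI:      ", gaussian_copula_mi(0.7))
print("Monte Carlo MI:", gaussian_copula_mi_mc(0.7))
```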